Add support for OpenAI Responses API#1492

Open
Copilot wants to merge 8 commits into main from copilot/add-openai-responses-support
Conversation


Copilot AI commented Jan 13, 2026

OpenAI recommends the Responses API (/v1/responses) over Chat Completions for new projects. This adds support for it across all OpenAI-related plugins while preserving existing Chat Completions behavior.

The Responses API uses an input array of role/content objects instead of messages, and returns an output array with structured content. Token usage field names also differ: input_tokens/output_tokens instead of prompt_tokens/completion_tokens.
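The two usage shapes can be reconciled with a small helper. This Python sketch is illustrative only (the PR's actual handling lives in the C# OpenAIResponseUsage property aliases); the field names come from the OpenAI API, while the helper itself is hypothetical:

```python
def normalize_usage(usage: dict) -> dict:
    """Map Responses API (input_tokens/output_tokens) and Chat Completions
    (prompt_tokens/completion_tokens) usage objects to a single shape."""
    return {
        "input_tokens": usage.get("input_tokens", usage.get("prompt_tokens", 0)),
        "output_tokens": usage.get("output_tokens", usage.get("completion_tokens", 0)),
    }

# Both naming conventions normalize to the same result:
chat_usage = normalize_usage({"prompt_tokens": 10, "completion_tokens": 5})
responses_usage = normalize_usage({"input_tokens": 10, "output_tokens": 5})
```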

Changes

OpenAIModels.cs

  • Added OpenAIResponsesRequest, OpenAIResponsesResponse, and supporting types
  • Extended TryGetOpenAIRequest to detect Responses API requests by checking for an input array whose items contain role or type properties
  • Added TryGetCompletionLikeRequest shared method for plugins that only handle text generation
  • Extended OpenAIResponseUsage to deserialize both token naming conventions via property aliases

Plugin Updates

  • LanguageModelFailurePlugin: Injects fault prompts into Responses API input array
  • LanguageModelRateLimitingPlugin: Recognizes Responses API for rate limit tracking
  • OpenAITelemetryPlugin: Adds telemetry tags/metrics for Responses API operations
  • OpenAIMockResponsePlugin: Mocks Responses API by converting to/from chat completion format
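As a rough illustration of the mocking approach named in the last bullet, this hypothetical Python sketch converts a Responses API request body to Chat Completions shape and wraps a chat reply back as a Responses-style output item. The field names follow the OpenAI API; the plugin's real C# mapping may differ in detail:

```python
def responses_to_chat(body: dict) -> dict:
    """Rewrite a Responses API request as a Chat Completions request
    by renaming the input array to messages."""
    return {"model": body["model"], "messages": list(body["input"])}

def chat_to_responses_output(chat_response: dict) -> dict:
    """Wrap the first chat choice's message as a Responses-style
    output item with structured content."""
    msg = chat_response["choices"][0]["message"]
    return {
        "output": [{
            "type": "message",
            "role": msg["role"],
            "content": [{"type": "output_text", "text": msg["content"]}],
        }]
    }
```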

Example Request Detection

// Responses API request structure
{
  "model": "gpt-4o",
  "input": [
    { "role": "user", "content": "Hello" }
  ]
}

// Detected via:
if (rawRequest.TryGetProperty("input", out var inputProperty) &&
    inputProperty.ValueKind == JsonValueKind.Array &&
    inputProperty.GetArrayLength() > 0)
{
    var firstItem = inputProperty.EnumerateArray().First();
    if (firstItem.TryGetProperty("role", out _) || firstItem.TryGetProperty("type", out _))
    {
        // Responses API request
    }
}
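For quick experimentation outside the proxy, the same heuristic can be sketched in Python (illustrative only; the PR's detection is the C# above, and the function name here is hypothetical):

```python
import json

def is_responses_api_request(raw: str) -> bool:
    """Mirror of the detection heuristic: a Responses API request has a
    non-empty `input` array whose first item carries a `role` or `type` key."""
    body = json.loads(raw)
    items = body.get("input")
    if not isinstance(items, list) or not items:
        return False
    first = items[0]
    return isinstance(first, dict) and ("role" in first or "type" in first)
```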

Warning

Firewall rules blocked the agent from connecting to the following address during this run:

  • platform.openai.com
    • Triggering command: /home/REDACTED/work/_temp/ghcca-node/node/bin/node /home/REDACTED/work/_temp/ghcca-node/node/bin/node --enable-source-maps /home/REDACTED/work/_temp/copilot-developer-action-main/dist/index.js (dns block)
Original prompt

This section details the original issue to resolve

<issue_title>Add support for OpenAI Responses API</issue_title>
<issue_description>OpenAI recommends the use of Responses API instead of Chat Completions for new projects. Let's add support for it for all OpenAI-related plugins so that we support both chat completions and responses. More information https://platform.openai.com/docs/guides/migrate-to-responses

Update:

  • LanguageModelFailurePlugin
  • LanguageModelRateLimitingPlugin
  • OpenAITelemetryPlugin
  • OpenAIUsageDebuggingPlugin
  • OpenAIMockResponsePlugin

Ensure that existing functionality keeps working unaltered.</issue_description>


Copilot AI and others added 3 commits January 13, 2026 13:24, each co-authored by waldekmastykarz <11164679+waldekmastykarz@users.noreply.github.com>; one commit subject ends in "…e ID prefix".
Copilot AI changed the title [WIP] Add support for OpenAI Responses API Add support for OpenAI Responses API Jan 13, 2026
Copilot AI requested a review from waldekmastykarz January 13, 2026 13:35
@waldekmastykarz waldekmastykarz marked this pull request as ready for review February 22, 2026 09:34
@waldekmastykarz waldekmastykarz requested a review from a team as a code owner February 22, 2026 09:34
Copilot AI review requested due to automatic review settings February 22, 2026 09:34
Copilot AI left a comment
Pull request overview

This pull request adds support for the OpenAI Responses API (/v1/responses) to complement the existing Chat Completions API support. The Responses API is OpenAI's recommended approach for new projects. The implementation extends all relevant OpenAI-related plugins to handle both API formats while maintaining backward compatibility.

Changes:

  • Extended OpenAI model classes to support Responses API request/response structures with property aliases for token usage fields
  • Added TryGetCompletionLikeRequest helper method for plugins that only handle text generation APIs
  • Updated four plugins (LanguageModelFailurePlugin, LanguageModelRateLimitingPlugin, OpenAITelemetryPlugin, OpenAIMockResponsePlugin) to recognize and process Responses API requests

Reviewed changes

Copilot reviewed 5 out of 5 changed files in this pull request and generated 12 comments.

Summary per file:

  • DevProxy.Abstractions/LanguageModel/OpenAIModels.cs: adds Responses API model classes, detection logic in TryGetOpenAIRequest, the new TryGetCompletionLikeRequest helper, and property aliases in OpenAIResponseUsage for token field mapping
  • DevProxy.Plugins/Behavior/LanguageModelFailurePlugin.cs: adds fault injection support for the Responses API by appending fault prompts to the input array
  • DevProxy.Plugins/Behavior/LanguageModelRateLimitingPlugin.cs: switches to TryGetCompletionLikeRequest to include the Responses API in rate limit tracking
  • DevProxy.Plugins/Inspection/OpenAITelemetryPlugin.cs: adds telemetry tags/metrics collection for Responses API operations, including request parameters and response status
  • DevProxy.Plugins/Mocking/OpenAIMockResponsePlugin.cs: implements Responses API mocking by converting between the Responses format and the Chat Completion format for the local LM
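To illustrate why rate limit tracking needs to read both usage formats, here is a hypothetical Python sketch of token budget accounting (class and method names are invented; LanguageModelRateLimitingPlugin's real implementation is C# and may differ):

```python
class TokenBudget:
    """Track consumed tokens against a limit, accepting usage objects
    from either the Chat Completions or the Responses API."""

    def __init__(self, limit: int):
        self.limit = limit
        self.used = 0

    def record(self, usage: dict) -> bool:
        """Charge a response's tokens; return False once over budget."""
        total = usage.get(
            "total_tokens",
            usage.get("input_tokens", usage.get("prompt_tokens", 0))
            + usage.get("output_tokens", usage.get("completion_tokens", 0)),
        )
        self.used += total
        return self.used <= self.limit
```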

  • Extract shared IsResponsesApiRequest helper to deduplicate detection logic
  • Remove unused OpenAIResponsesUsage class
  • Add missing base properties in LanguageModelFailurePlugin Responses API branch
  • Fix complex content formatting in OpenAITelemetryPlugin

Development

Successfully merging this pull request may close these issues.

Add support for OpenAI Responses API
